Algorithms for network piecewise-linear programs
In this paper a subarea of Piecewise-Linear Programming named Network Piecewise-Linear Programming (NPLP) is discussed. First, the problem formulation, main definitions, and related concepts are presented. The paper then presents four specialized algorithms for NPLP, together with the results of a preliminary computational study.
Scale-free movement patterns in termites emerge from social interactions and preferential attachments
As the number or density of interacting individuals in a social group increases, a transition can develop from uncorrelated, disordered behaviour of the individuals to a collective coherent pattern. We expand this observation by exploring the fine details of termite movement patterns to demonstrate that the value of the scaling exponent µ of a power law describing the Lévy walk of an individual is modified collectively as the density of animals in the group changes. This effect is absent when termites interact with inert obstacles. We also show that the network of encounters and interactions among specific individuals is selective, resembling a preferential attachment mechanism, which is important for social networking. Our data strongly suggest that preferential attachments, a phenomenon not reported previously, and favourite interactions with a limited number of acquaintances are responsible for the generation of Lévy movement patterns in these social insects.
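The scaling exponent µ discussed above is typically estimated from observed step lengths by maximum likelihood. A minimal sketch of the standard continuous power-law estimator, tested on synthetic step lengths (the function name, the chosen exponent, and the cutoff x_min are illustrative assumptions, not values from the paper):

```python
import math
import random

def powerlaw_mle_exponent(steps, x_min):
    """Maximum-likelihood estimate of the exponent mu of a
    continuous power law p(x) ~ x^(-mu) for x >= x_min."""
    tail = [x for x in steps if x >= x_min]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)

# Synthetic Levy-like step lengths drawn by inverse-transform
# sampling from p(x) ~ x^(-2) with x_min = 1.
rng = random.Random(0)
samples = [1.0 / (1.0 - rng.random()) for _ in range(100_000)]

mu_hat = powerlaw_mle_exponent(samples, x_min=1.0)
print(round(mu_hat, 2))  # close to the true exponent mu = 2
```

A density-dependent µ of the kind reported here would show up as a systematic shift of this estimate when it is computed separately for groups of different sizes.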
Activity of iron phosphate-solubilizing microorganisms in industrial residue.
FertBio 2016
Cosmoglobe: Towards end-to-end CMB cosmological parameter estimation without likelihood approximations
We implement support for a cosmological parameter estimation algorithm as
proposed by Racine et al. (2016) in Commander, and quantify its computational
efficiency and cost. For a semi-realistic simulation similar to Planck LFI 70
GHz, we find that the computational cost of producing one single sample is
about 60 CPU-hours and that the typical Markov chain correlation length is
100 samples. The net effective cost per independent sample is 6000
CPU-hours, in comparison with all low-level processing costs of 812 CPU-hours
for Planck LFI and WMAP in Cosmoglobe Data Release 1. Thus, although
technically possible to run already in its current state, future work should
aim to reduce the effective cost per independent sample by at least one order
of magnitude to avoid excessive runtimes, for instance through multi-grid
preconditioners and/or derivative-based Markov chain sampling schemes. This
work demonstrates the computational feasibility of true Bayesian cosmological
parameter estimation with end-to-end error propagation for high-precision CMB
experiments without likelihood approximations, but it also highlights the need
for additional optimizations before it is ready for full production-level
analysis.
Comment: 10 pages, 8 figures. Submitted to A&A.
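The cost figures above combine multiplicatively: the effective cost of one independent sample is the per-sample cost times the chain correlation length. A trivial check using only the numbers quoted in the abstract (the function name is an illustrative assumption):

```python
def effective_cost(cpu_hours_per_sample, correlation_length):
    """CPU-hours needed per statistically independent Markov chain sample."""
    return cpu_hours_per_sample * correlation_length

# Figures quoted above for the semi-realistic LFI 70 GHz simulation.
cost = effective_cost(60, 100)
print(cost)  # 6000 CPU-hours per independent sample
```

This is the quantity the proposed multi-grid preconditioners or derivative-based samplers would need to reduce by at least an order of magnitude.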
Cosmoglobe DR1 results. I. Improved Wilkinson Microwave Anisotropy Probe maps through Bayesian end-to-end analysis
We present Cosmoglobe Data Release 1, which implements the first joint
analysis of WMAP and Planck LFI time-ordered data, processed within a single
Bayesian end-to-end framework. This framework builds directly on a similar
analysis of the LFI measurements by the BeyondPlanck collaboration, and
approaches the CMB analysis challenge through Gibbs sampling of a global
posterior distribution, simultaneously accounting for calibration, mapmaking,
and component separation. The computational cost of producing one complete
WMAP+LFI Gibbs sample is 812 CPU-hours, of which 603 are spent on WMAP
low-level processing; this demonstrates that end-to-end Bayesian analysis of
the WMAP data is computationally feasible. We find that our WMAP posterior mean
temperature sky maps and CMB temperature power spectrum are largely consistent
with the official WMAP9 results. Perhaps the most notable difference is that
our CMB dipole amplitude is $11\,\mathrm{\mu K}$ ($2.5\,\sigma$) higher than
BeyondPlanck; however, it is in perfect agreement with the HFI-dominated Planck
PR4 result. In contrast, our WMAP polarization maps differ more notably from
the WMAP9 results, and in general exhibit significantly lower large-scale
residuals. We attribute this to a better constrained gain and transmission
imbalance model. It is particularly noteworthy that the W-band polarization sky
map, which was excluded from the official WMAP cosmological analysis, for the
first time appears visually consistent with the V-band sky map. Similarly, the
long-standing discrepancy between the WMAP K-band and LFI 30 GHz maps is
finally resolved, and the difference between the two maps appears consistent
with instrumental noise at high Galactic latitudes. All maps and the associated
code are made publicly available through the Cosmoglobe web page.
Comment: 65 pages, 61 figures. Data available at cosmoglobe.uio.no. Submitted to A&A.
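The global Gibbs sampling strategy described above draws each parameter block in turn from its conditional distribution given all the others. A toy illustration on a bivariate Gaussian (not the actual Commander sampler; the correlation rho and sample count are arbitrary choices), which also shows how strong conditional correlations inflate the chain correlation length, as seen in the ~100-sample chains discussed above:

```python
import random
import statistics

def gibbs_bivariate_gaussian(rho, n_samples, seed=1):
    """Toy Gibbs sampler for a zero-mean, unit-variance bivariate
    Gaussian with correlation rho: alternately draw each coordinate
    from its conditional, x | y ~ N(rho * y, 1 - rho^2)."""
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    xs = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        xs.append(x)
    return xs

xs = gibbs_bivariate_gaussian(rho=0.95, n_samples=50_000)
# The marginal of x should approach N(0, 1), but with rho = 0.95
# successive samples are highly correlated, so many draws are needed
# per effectively independent sample.
print(round(statistics.pstdev(xs), 2))
```

The same trade-off drives the cost accounting above: each Gibbs sweep is expensive, and correlations between sweeps multiply the price of an independent sample.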
The BINGO project: IV. Simulations for mission performance assessment and preliminary component separation steps
Aims. The large-scale distribution of neutral hydrogen (HI) in the Universe is luminous through its 21 cm emission. The goal of the Baryon Acoustic Oscillations from Integrated Neutral Gas Observations (BINGO) radio telescope is to detect baryon acoustic oscillations at radio frequencies through 21 cm intensity mapping (IM). The telescope will span the redshift range 0.127 < z < 0.449 with an instantaneous field-of-view of 14.75° × 6.0°.
Methods. In this work we investigate different constructive and operational scenarios of the instrument by generating sky maps as they would be produced by the instrument. In doing this we use a set of end-to-end IM mission simulations. The maps are additionally used to evaluate the efficiency of a component separation method (GNILC).
Results. We have simulated the kind of data that would be produced in a single-dish IM experiment such as BINGO. Based on the results obtained, we have optimized the focal plane design of the telescope. In addition, applying the GNILC method to simulated data shows that it is feasible to extract the cosmological signal across a wide range of multipoles and redshifts. The results are comparable with those of the standard principal component analysis method.
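The principal component analysis baseline mentioned above exploits the fact that foregrounds are smooth and strongly correlated across frequency channels, while the 21 cm signal is not: removing the leading eigenmodes of the frequency-frequency covariance removes most of the foreground. A minimal sketch on mock data (all amplitudes, channel counts, and the spectral index are hypothetical, chosen only so the rank-1 foreground dominates; this is plain PCA, not GNILC):

```python
import numpy as np

rng = np.random.default_rng(42)
n_freq, n_pix = 30, 5000

# Mock single-dish IM data: a bright foreground following one spatial
# template scaled by a smooth power law in frequency, plus a weak
# frequency-uncorrelated "cosmological" signal and instrumental noise.
freqs = np.linspace(0.98, 1.26, n_freq)[:, None]      # GHz, BINGO-like band
foreground = 100.0 * freqs ** -2.7 * rng.normal(size=(1, n_pix))
signal = 0.1 * rng.normal(size=(n_freq, n_pix))
data = foreground + signal + 0.01 * rng.normal(size=(n_freq, n_pix))

# PCA cleaning: project out the strongest eigenmode of the
# frequency-frequency covariance (eigh returns ascending eigenvalues).
cov = data @ data.T / n_pix
eigvals, eigvecs = np.linalg.eigh(cov)
modes = eigvecs[:, -1:]                  # dominant mode -> foreground
cleaned = data - modes @ (modes.T @ data)

print(float(data.std()), float(cleaned.std()))
# Residual scatter drops from the foreground level to roughly the
# injected signal-plus-noise level.
```

In practice the number of removed modes is a tuning choice: too few leave foreground residuals, too many subtract cosmological signal, which is the trade-off GNILC is designed to handle adaptively.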